Analog CMOS implementation of a multi-layer perceptron with nonlinear synapses

Author

  • Jerzy B. Lont
Abstract

A neurocomputer based on a high-density analog integrated circuit developed in a 3 µm CMOS technology has been built. The 1.6 mm × 2.4 mm chip contains 18 neurons and 161 synapses in three layers, and provides 16 inputs and 4 outputs. The weights are stored on storage capacitors in the synapses. A formalization of the error back-propagation algorithm that allows the use of very small nonlinear synapses is presented. The influence of offset voltages in the synapses on the circuit performance is analyzed. Some experimental results are reported and discussed.
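
The central algorithmic idea, a back-propagation rule that tolerates nonlinear synapses, can be illustrated with a short sketch. In the Python/NumPy code below, each synapse computes a generic nonlinear function g(w, x) of its stored weight and input instead of the plain product w·x, so the chain rule substitutes ∂g/∂w and ∂g/∂x for the usual factors x and w. The saturating g used here is only an assumed placeholder, not the transfer characteristic of the reported chip.

```python
import numpy as np

# Illustrative sketch (not the chip's actual transfer curves): back-propagation
# through a layer whose synapses compute a nonlinear function g(w, x) of the
# stored weight and the input, so net_j = sum_i g(w_ji, x_i) replaces the
# usual sum_i w_ji * x_i.

def g(w, x):
    """Assumed synapse characteristic: output saturates in the stored weight."""
    return np.tanh(w) * x

def dg_dw(w, x):
    """Partial derivative of the synapse output w.r.t. its weight."""
    return (1.0 - np.tanh(w) ** 2) * x

def dg_dx(w, x):
    """Partial derivative of the synapse output w.r.t. its input."""
    return np.tanh(w)

def forward(W, x):
    """W: (n_out, n_in) weights, x: (n_in,) inputs -> layer activations."""
    net = g(W, x[np.newaxis, :]).sum(axis=1)   # net_j = sum_i g(w_ji, x_i)
    return np.tanh(net)                        # neuron nonlinearity

def backward(W, x, delta):
    """delta: dE/dnet for this layer's neurons, shape (n_out,)."""
    # Weight gradient: dg/dw replaces the usual factor x_i.
    grad_W = delta[:, np.newaxis] * dg_dw(W, x[np.newaxis, :])
    # Error sent back to the previous layer: dg/dx replaces the usual w_ji.
    delta_prev = (delta[:, np.newaxis] * dg_dx(W, x[np.newaxis, :])).sum(axis=0)
    return grad_W, delta_prev
```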

Related articles

Analog Multiplier for Feed Forward Neural Network Signal Processing

In this paper, a new implementation of a four-quadrant CMOS analog multiplier circuit for feed-forward multi-layer perceptron neural networks is proposed. The proposed multiplier is divided into two or three parts, which are placed in the input, the synapse, and the neuron. The main characteristics of the proposed circuit are its small silicon area and low power consumption...
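
For orientation only, the Python sketch below shows one common way a four-quadrant multiplication can be distributed over input, synapse and neuron stages, by encoding signed quantities as differential pairs and forming single-quadrant partial products. The partitioning, names and signal representation are illustrative assumptions, not the circuit proposed in this paper.

```python
# Behavioural sketch of one way to split a four-quadrant multiplication across
# the input, synapse and neuron stages.  NOT taken from the cited circuit; it
# only illustrates the idea of distributing the operation over the stages.

def input_stage(x):
    """Encode a signed input as a differential pair (x_plus, x_minus >= 0)."""
    return (max(x, 0.0), max(-x, 0.0))

def synapse_stage(w, x_pair):
    """Form single-quadrant partial products with a signed weight w."""
    w_plus, w_minus = max(w, 0.0), max(-w, 0.0)
    x_plus, x_minus = x_pair
    i_plus = w_plus * x_plus + w_minus * x_minus     # contributes positively
    i_minus = w_plus * x_minus + w_minus * x_plus    # contributes negatively
    return (i_plus, i_minus)

def neuron_stage(current_pairs):
    """Sum the differential currents and recover the signed net input."""
    total_plus = sum(p for p, _ in current_pairs)
    total_minus = sum(m for _, m in current_pairs)
    return total_plus - total_minus                  # equals sum_i w_i * x_i

# The distributed computation reproduces an ordinary weighted sum.
xs, ws = [0.5, -1.0, 2.0], [-0.3, 0.8, 0.25]
net = neuron_stage([synapse_stage(w, input_stage(x)) for w, x in zip(ws, xs)])
assert abs(net - sum(w * x for w, x in zip(ws, xs))) < 1e-12
```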

Design and Non-linear Modelling of CMOS Multipliers for Analog VLSI Implementation of Neural Algorithms

Analog VLSI looks like an attractive way of implementing Artificial Neural Networks: it offers small area, low power consumption, and compact design of neural computational primitive circuits. On the other hand, its major drawbacks are low computational accuracy and the non-linear behaviour of analog circuits. In this paper, we present the design and the detailed b...

Neural Computation with Winner-Take-All as the Only Nonlinear Operation

Everybody “knows” that neural networks need more than a single layer of nonlinear units to compute interesting functions. We show that this is false if one employs winner-take-all as the nonlinear unit: any boolean function can be computed by a single k-winner-take-all unit applied to weighted sums of the input variables. Any continuous function can be approximated arbitrarily well by a single soft w...
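
As a small concrete check of that claim, the Python sketch below computes XOR, a function no single threshold gate can compute, using one hard winner-take-all (k = 1) applied to three weighted sums of the inputs. The particular sums are an illustrative choice, not the general construction of the paper.

```python
# XOR from a single hard winner-take-all (k = 1) over three weighted sums of
# (x1, x2).  The weights below are an illustrative choice of ours.
import itertools

WEIGHTS = [(2.0, 2.0, 0.0),     # S1 = 2*x1 + 2*x2
           (-1.0, -1.0, 1.0),   # S2 = -x1 - x2 + 1
           (3.0, 3.0, -1.5)]    # S3 = 3*x1 + 3*x2 - 1.5

def wta_xor(x1, x2):
    sums = [a * x1 + b * x2 + c for a, b, c in WEIGHTS]
    # Output 1 iff the first weighted sum is the unique winner.
    return int(sums[0] == max(sums) and sums.count(max(sums)) == 1)

for x1, x2 in itertools.product((0, 1), repeat=2):
    assert wta_xor(x1, x2) == (x1 ^ x2)
```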

A TS Fuzzy Model Derived from a Typical Multi-Layer Perceptron

In this paper, we introduce a Takagi-Sugeno (TS) fuzzy model which is derived from a typical Multi-Layer Perceptron Neural Network (MLP NN). First, it is shown that the considered MLP NN can be interpreted as a variant of the TS fuzzy model. It is then discussed that the Membership Function (MF) used in such a TS fuzzy model, despite its flexible structure, has some major restrictions. After modify...

New full adders using multi-layer perceptron network

How to reconfigure a logic gate for a variety of functions is an interesting topic. In this paper, a different method of designing logic gates is proposed. Owing to the training ability of the multilayer perceptron neural network, it was used to create a new type of logic gate and full adder. In this method, the perceptron network was trained and then tested. This network was 100% ac...
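
To make the idea concrete, here is a hedged Python sketch that trains a small MLP on the eight-row full-adder truth table (outputs: sum and carry-out). The layer sizes, learning rate and random seed are arbitrary choices rather than the network reported in the paper, and reaching 100% training accuracy is typical but not guaranteed for every seed.

```python
import numpy as np

# Illustrative sketch (not the gate design of the paper): train a small MLP on
# the full-adder truth table, i.e. (sum, carry_out) of three input bits.

rng = np.random.default_rng(0)

# All 8 input combinations of (a, b, carry_in) and their (sum, carry_out).
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], float)
Y = np.column_stack([X.sum(axis=1) % 2, (X.sum(axis=1) >= 2).astype(float)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 12 sigmoid units, 2 sigmoid outputs, trained with
# full-batch gradient descent on the mean-squared error.
W1, b1 = rng.normal(0, 1, (3, 12)), np.zeros(12)
W2, b2 = rng.normal(0, 1, (12, 2)), np.zeros(2)
lr = 2.0

for _ in range(20000):
    H = sigmoid(X @ W1 + b1)          # hidden activations
    O = sigmoid(H @ W2 + b2)          # network outputs
    dO = (O - Y) * O * (1 - O)        # output deltas (MSE + sigmoid)
    dH = (dO @ W2.T) * H * (1 - H)    # hidden deltas (back-propagation)
    W2 -= lr * H.T @ dO / len(X); b2 -= lr * dO.mean(axis=0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
print("truth-table accuracy:", (pred == Y).mean())
```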

Journal:
  • IEEE Transactions on Neural Networks

Volume 3, Issue 3

Pages: -

Year of publication: 1992